New Variants of Newton's Method for Nonlinear Unconstrained Optimization Problems

Authors

  • Vinay Kanwar
  • Kapil K. Sharma
  • Ramandeep Behl
Abstract

In this paper, we propose new variants of Newton’s method, based on a quadrature formula and the power mean, for solving nonlinear unconstrained optimization problems. It is proved that the order of convergence of the proposed family is three. Numerical comparisons are made to show the performance of the presented methods. Furthermore, the numerical experiments demonstrate that the logarithmic-mean Newton method outperforms the classical Newton method and the other variants of Newton’s method. MSC: 65H05.
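To make the construction concrete, the sketch below applies the standard quadrature-based idea to the one-dimensional stationarity condition f'(x) = 0: a classical Newton predictor step is followed by a corrector that replaces the curvature at the current point with a mean (arithmetic, harmonic, or logarithmic) of the second derivative at the current and predicted points. This is a minimal illustration under that assumption; the function names and the exact update may differ from the family derived in the paper.

```python
import math


def mean_newton_min(df, d2f, x0, mean="arithmetic", tol=1e-10, max_iter=50):
    """Third-order, mean-based Newton iteration for minimizing a smooth
    one-dimensional function, i.e. for solving the stationarity condition
    df(x) = 0.  Sketch only: the predictor z = x - df(x)/d2f(x) is the
    ordinary Newton step, and the corrector divides df(x) by a mean of the
    curvatures d2f(x) and d2f(z); the exact family in the paper may differ.
    """

    def M(a, b):
        if mean == "arithmetic":
            return 0.5 * (a + b)
        if mean == "harmonic":
            return 2.0 * a * b / (a + b)
        if mean == "logarithmic":  # logarithmic mean; assumes a, b > 0
            return a if abs(a - b) < 1e-14 else (a - b) / (math.log(a) - math.log(b))
        raise ValueError("unknown mean: " + mean)

    x = x0
    for _ in range(max_iter):
        g = df(x)
        if abs(g) < tol:
            break
        z = x - g / d2f(x)             # predictor: classical Newton step
        x = x - g / M(d2f(x), d2f(z))  # corrector: mean of the two curvatures
    return x


# Example: f(x) = x**4 - 3*x**3 + 2 has a local minimizer at x = 2.25.
df = lambda x: 4 * x**3 - 9 * x**2
d2f = lambda x: 12 * x**2 - 18 * x
print(mean_newton_min(df, d2f, x0=3.0, mean="logarithmic"))  # -> approx 2.25
```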


Similar resources

Modify the line search formula in the BFGS method to achieve global convergence.


Geometrically Constructed Families of Newton's Method for Unconstrained Optimization and Nonlinear Equations

One-parameter families of Newton’s iterative method for the solution of nonlinear equations, and their extension to unconstrained optimization problems, are presented in the paper. These methods are derived by implementing approximations through a straight line and through a parabolic curve in the vicinity of the root. The presented variants are found to yield better performance than Newton’s metho...
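As a rough illustration of that construction (not necessarily the parameterization used in the paper), replacing the local straight-line model f(x_n) + f'(x_n)*h = 0 behind Newton’s method with a parabolic model f(x_n) + f'(x_n)*h + a*h**2 = 0 and solving for the step h already yields a one-parameter family that reduces to Newton’s method when a = 0:

```python
import math


def parabolic_newton(f, df, x0, a=0.0, tol=1e-12, max_iter=50):
    """Illustrative one-parameter iteration for f(x) = 0 built from the local
    model f(x_n) + f'(x_n)*h + a*h**2 = 0: a straight line when a = 0 (the
    classical Newton step) and a parabola otherwise.  A sketch of the general
    idea only; the families constructed in the paper may be parameterized
    differently.
    """
    x = x0
    for _ in range(max_iter):
        fx, dfx = f(x), df(x)
        if abs(fx) < tol:
            break
        disc = dfx * dfx - 4.0 * a * fx
        if a == 0.0 or disc <= 0.0:
            h = -fx / dfx              # straight-line model: Newton step
        else:
            # root of the parabolic model closest to the Newton step
            h = -2.0 * fx / (dfx + math.copysign(math.sqrt(disc), dfx))
        x += h
    return x


# Example: the real root of f(x) = x**3 - 2.
f, df = (lambda x: x**3 - 2), (lambda x: 3 * x**2)
print(parabolic_newton(f, df, x0=1.5, a=1.0))  # -> approx 1.2599
```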


Preconditioned Krylov Subspace Methods in Nonlinear Optimization

One of the possible ways of solving general problems of constrained nonlinear optimization is to convert them into a sequence of unconstrained problems. Then the need arises to solve an unconstrained optimization problem reliably and efficiently. For this aim, Newton methods are usually applied, often in combination with sparse Cholesky decomposition. In practice, however, this approach may not...


The modified BFGS method with new secant relation for unconstrained optimization problems

Using Taylor's series, we propose a modified secant relation to obtain a more accurate approximation of the second-order curvature of the objective function. Then, based on this modified secant relation, we present a new BFGS method for solving unconstrained optimization problems. The proposed method makes use of both gradient and function values, while the usual secant relation uses only gradient values. U...
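As a sketch of how such a relation enters the update, the snippet below plugs a function-value-based secant modification into the standard inverse BFGS formula. The correction term shown is the well-known Zhang-Deng-Chen style modification from the literature and is only a stand-in; the secant relation actually proposed in the paper may differ.

```python
import numpy as np


def modified_bfgs_update(H, s, y, f_old, f_new, g_old, g_new):
    """One inverse-Hessian BFGS update using a modified secant vector that
    also involves function values.  The correction below is the classical
    Zhang-Deng-Chen style term from the literature; it is only a stand-in
    for the new secant relation proposed in the paper.
    """
    theta = 6.0 * (f_old - f_new) + 3.0 * (g_old + g_new) @ s
    y_mod = y + (theta / (s @ s)) * s          # modified secant vector
    if y_mod @ s <= 1e-12 * np.linalg.norm(s) * np.linalg.norm(y_mod):
        return H                               # skip update; keep H positive definite
    rho = 1.0 / (y_mod @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y_mod)
    return V @ H @ V.T + rho * np.outer(s, s)  # standard BFGS form with y_mod
```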


A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems

In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods, established through eigenvalue analysis, is that the search direction always satisfies the sufficient descent condition independent of the line search method. The globa...
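For illustration, the sketch below builds a generic three-term LS-type direction whose coefficients are chosen so that the inner product of the new gradient with the new direction equals minus the squared gradient norm, which is one simple way to obtain sufficient descent independent of the line search; it is not necessarily either of the directions analyzed in the paper.

```python
import numpy as np


def three_term_ls_direction(g_new, g_old, d_old):
    """Generic three-term Liu-Storey (LS) type search direction
        d_new = -g_new + beta * d_old - theta * y,   y = g_new - g_old,
    with beta and theta chosen so that g_new @ d_new = -||g_new||**2, i.e.
    the sufficient descent condition holds regardless of the line search.
    An illustrative construction only, not necessarily either of the
    directions proposed in the paper.
    """
    y = g_new - g_old
    denom = -(g_old @ d_old)   # LS denominator; positive when d_old is a descent direction
    beta = (g_new @ y) / denom
    theta = (g_new @ d_old) / denom
    d_new = -g_new + beta * d_old - theta * y
    # the two extra terms cancel in g_new @ d_new, leaving exactly -||g_new||^2
    assert np.isclose(g_new @ d_new, -(g_new @ g_new))
    return d_new
```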




Journal:
  • Intelligent Information Management

Volume 2, Issue

Pages  -

Publication date: 2010